9 research outputs found

    Blockchain-IoT peer device storage optimization using an advanced time-variant multi-objective particle swarm optimization algorithm

    The integration of Internet of Things (IoT) devices onto the Blockchain increases the number of transactions that occur on the Blockchain, and hence its storage requirements. One solution approach is to leverage cloud resources for storing blocks within the chain. This paper therefore proposes two solutions to this problem: an improved hybrid architecture design that uses containerization to create a side chain on a fog node for the devices connected to it, and an Advanced Time‑variant Multi‑objective Particle Swarm Optimization Algorithm (AT‑MOPSO) for determining the optimal number of blocks that should be transferred to the cloud for storage. The algorithm uses time‑variant weights for the particle swarm velocity update, together with the non‑dominated sorting and mutation schemes from NSGA‑III. The proposed algorithm was compared with the original MOPSO algorithm, the Strength Pareto Evolutionary Algorithm (SPEA‑II), the Pareto Envelope‑based Selection Algorithm with region‑based selection (PESA‑II), and NSGA‑III. AT‑MOPSO showed better results than these algorithms in cloud storage cost and query probability optimization. Importantly, AT‑MOPSO achieved 52% energy efficiency compared with NSGA‑III. To show how the algorithm can be applied to a real‑world Blockchain system, the BISS industrial Blockchain architecture was adapted and modified to demonstrate how AT‑MOPSO can be used with existing Blockchain systems and the benefits it provides.
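The time-variant weighting the abstract describes can be sketched as follows. This is a minimal illustration of a time-variant PSO velocity update, not the paper's actual AT-MOPSO; the weight bounds (0.9/0.4) and acceleration schedules are common textbook values assumed for the example.

```python
import random

def time_variant_weights(t, t_max, w_max=0.9, w_min=0.4, c_init=2.5, c_final=0.5):
    """Linearly vary the inertia weight and acceleration coefficients over
    iterations (illustrative values, not taken from the paper)."""
    frac = t / t_max
    w = w_max - (w_max - w_min) * frac       # inertia decays: explore -> exploit
    c1 = c_init - (c_init - c_final) * frac  # cognitive coefficient decays
    c2 = c_final + (c_init - c_final) * frac # social coefficient grows
    return w, c1, c2

def update_velocity(v, x, pbest, gbest, t, t_max):
    """One-dimensional PSO velocity update using the time-variant weights."""
    w, c1, c2 = time_variant_weights(t, t_max)
    r1, r2 = random.random(), random.random()
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```

Early iterations favor exploration (high inertia, high cognitive pull); later iterations favor convergence toward the swarm's global best.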

    On Distributed Denial of Service Current Defense Schemes

    Distributed denial of service (DDoS) attacks are a major threat to any network-based service provider. The ability of an attacker to harness the power of many compromised devices to launch an attack makes it even more complex to handle, and this complexity increases further when several attackers coordinate an attack on one victim. Moreover, attackers these days do not need to be highly skilled to perpetrate an attack: tools for orchestrating one can easily be found online and require little to no knowledge of attack scripts. Numerous studies have been conducted to develop defense mechanisms that detect and defend against DDoS attacks. As defense schemes are designed and developed, attackers keep moving to evade them, so there is a need for continual study of defense mechanisms. This paper discusses the current DDoS defense mechanisms and their strengths and weaknesses.

    A Proposed DoS Detection Scheme for Mitigating DoS Attack Using Data Mining Techniques

    A denial of service (DoS) attack in a computer network is an attack on the availability of computer resources that prevents users from accessing those resources over the network. Denial of service attacks can be costly, capable of reaching $100,000 per hour. The development of easily accessible, simple DoS tools has increased the frequency of attacks and reduced the level of expertise needed to launch one. Though these attack tools have been available for years, no defense mechanism has been proposed that targets them specifically. Most defense mechanisms in the literature are designed to defend against attacks captured in datasets like the KDD Cup 99 dataset from 20 years ago, generated by tools no longer used in modern attacks. In this paper, we capture and analyze traffic generated by some of these DoS attack tools using the Wireshark Network Analyzer and propose a signature-based DoS detection mechanism built on an SVM classifier to defend against attacks launched by these tools. Our proposed detection mechanism was tested with the Snort IDS and compared with existing defense mechanisms in the literature, achieving high detection accuracy, a low false-positive rate, and fast detection time.
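The signature-matching idea behind the proposed detector can be sketched as below. The signature fields and thresholds here are hypothetical placeholders, not the signatures the paper derived from its Wireshark captures, and the real system additionally feeds features to an SVM classifier.

```python
# Hypothetical signatures distilled from attack-tool traffic (illustrative only).
SIGNATURES = [
    {"name": "syn-flooder", "protocol": "TCP", "flags": "SYN", "min_rate_pps": 1000},
    {"name": "udp-flooder", "protocol": "UDP", "payload_len": 0, "min_rate_pps": 5000},
]

def match_signature(packet_stats):
    """Return the name of the first signature matched by the observed
    per-flow statistics, or None if the flow looks benign."""
    for sig in SIGNATURES:
        criteria = {k: v for k, v in sig.items() if k not in ("name", "min_rate_pps")}
        if all(packet_stats.get(k) == v for k, v in criteria.items()) and \
           packet_stats.get("rate_pps", 0) >= sig["min_rate_pps"]:
            return sig["name"]
    return None
```

A rate threshold per signature keeps ordinary bursts of matching packets (e.g. a normal TCP handshake) from triggering an alert.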

    Multi-Agent Reinforcement Learning Framework in SDN-IoT for Transient Load Detection and Prevention

    The fast emergence of IoT devices and their accompanying big, complex data has necessitated a shift from the traditional networking architecture to software-defined networks (SDNs) in recent times. Routing optimization and DDoS protection in the network have become a necessity for mobile network operators in maintaining good QoS and QoE for customers. Inspired by recent advancements in Machine Learning and Deep Reinforcement Learning (DRL), we propose a novel MADDPG-integrated multi-agent framework in SDN for efficient multipath routing optimization and for detecting and preventing malicious DDoS traffic in the network. The two MARL agents cooperate within the same environment to accomplish the network optimization task in a shorter time. The state, action, and reward of the proposed framework were modelled mathematically as a Markov Decision Process (MDP) and then integrated into the MADDPG algorithm. We compared the proposed MADDPG-based framework to DDPG on the network metrics delay, jitter, packet loss rate, bandwidth usage, and intrusion detection. The results show a significant improvement in network metrics with the two agents.
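The MDP modelling the abstract mentions can be sketched as a state and reward for the routing agent. The state fields mirror the metrics the paper evaluates (delay, jitter, loss, bandwidth usage), but the reward weights and the linear combination are assumptions for illustration, not the paper's formulation.

```python
from dataclasses import dataclass

@dataclass
class LinkState:
    """Per-path observation available to the routing agent."""
    delay_ms: float
    jitter_ms: float
    loss_rate: float       # fraction in [0, 1]
    bandwidth_util: float  # fraction in [0, 1]

def routing_reward(s, w_delay=0.4, w_jitter=0.2, w_loss=0.3, w_bw=0.1):
    """Hypothetical scalar reward: lower delay, jitter, loss, and link
    utilization yield a higher (less negative) reward. Weights are assumed."""
    return -(w_delay * s.delay_ms
             + w_jitter * s.jitter_ms
             + w_loss * 100 * s.loss_rate
             + w_bw * 100 * s.bandwidth_util)
```

In an actor-critic scheme like MADDPG, each agent's critic would score (state, action) pairs against such a reward while the actors learn routing and mitigation policies.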

    A 100 Gbps OFDM-Based 28 GHz Millimeter-Wave Radio over Fiber Fronthaul System for 5G

    Due to the unprecedented growth in mobile data traffic, emerging mobile access networks such as fifth-generation (5G) require huge bandwidth, with a mobile fronthaul architecture as an essential solution for providing high capacity in the future. To increase capacity, utilizing millimeter waves (mm-waves) in an analog radio over fiber (RoF) fronthaul link is the major advancement for achieving the higher bandwidth and data rates needed for 5G mobile communication. In this paper, we demonstrate the feasibility of transmitting and receiving a 100 Gbit/s data rate link at 28 GHz. The performance of three modulation formats (16-PSK, 16-QAM, and 64-QAM) has been compared for optical fiber lengths from 5 km up to 35 km under two detection systems: coherent and direct detection. In addition, the transmission impairments inherent to such systems are addressed through a digital signal processing (DSP) compensation scheme in the receiver to enhance system performance. Quality factor (QF) and bit error rate (BER) are used as metrics to evaluate the system performance. The proposed system model is designed and simulated using Optisystem 16.
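The two evaluation metrics, Q factor and BER, are related in a standard way for a Gaussian-noise model, which can be computed directly. This sketch uses the textbook relation BER ≈ ½·erfc(Q/√2); it is a general formula, not the specific figures reported in the paper.

```python
import math

def ber_from_q(q_linear):
    """Approximate BER from the linear Q factor under a Gaussian-noise
    model: BER ~= 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q_linear / math.sqrt(2))

def q_db(q_linear):
    """Q factor expressed in dB (20*log10, since Q is an amplitude ratio)."""
    return 20 * math.log10(q_linear)
```

For example, a linear Q of 6 (about 15.6 dB) corresponds to a BER near 1e-9, a common threshold for error-free optical transmission.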

    A Lightweight Messaging Protocol for Internet of Things Devices

    The move towards intelligent systems has led to the evolution of IoT. This technological leap has, over the past few years, introduced significant improvements to various aspects of the human environment, such as health, commerce, and transport. IoT is data-centric; hence, the underlying protocols must be scalable and sufficient to support vast D2D communication. Several application-layer protocols, such as CoAP and MQTT, are used for M2M communication. Even though these messaging protocols were designed for M2M communication, they are still not optimal for communications where message size and overhead are of much concern. This paper presents a Lightweight Messaging Protocol (LiMP), a minified version of CoAP. We present a detailed protocol stack for the proposed messaging protocol and benchmark it on several IoT devices. The proposed minified protocol achieves minimal overhead (a header size of 2 bytes) and faster point-to-point communication in the benchmark analysis: for communication over LAN, LiMP-TCP outperformed CoAP-TCP by an average of 21%, while LiMP-UDP outperformed it by over 37%. For device-to-remote-server communication, LiMP outperformed CoAP by an average of 15%.
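A 2-byte header like LiMP's can be packed and parsed as shown below. The paper only states the 2-byte size; the bit layout here (2-bit version, 2-bit message type, 4-bit code, 8-bit message ID) is a hypothetical layout for illustration, loosely echoing CoAP's field names.

```python
import struct

def pack_header(version, msg_type, code, msg_id):
    """Pack a hypothetical 2-byte LiMP-style header.
    Byte 0: version (2 bits) | type (2 bits) | code (4 bits).
    Byte 1: message ID (8 bits)."""
    first = ((version & 0x3) << 6) | ((msg_type & 0x3) << 4) | (code & 0xF)
    return struct.pack("!BB", first, msg_id & 0xFF)

def unpack_header(data):
    """Inverse of pack_header: return (version, msg_type, code, msg_id)."""
    first, msg_id = struct.unpack("!BB", data[:2])
    return first >> 6, (first >> 4) & 0x3, first & 0xF, msg_id
```

For comparison, CoAP's fixed header is 4 bytes, so halving the header is one plausible source of the reported overhead savings on small D2D messages.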

    An Investigation into the Application of Deep Learning in the Detection and Mitigation of DDOS Attack on SDN Controllers

    Software-Defined Networking (SDN) is a new paradigm that revolutionizes the idea of a software-driven network through the separation of the control and data planes, addressing the problems of traditional network architecture. Nevertheless, this architecture is exposed to several security threats, e.g., the distributed denial of service (DDoS) attack, which is hard to contain in such software-based networks. The concept of a centralized controller in SDN makes it a single point of attack as well as a single point of failure. In this paper, deep learning-based models, long short-term memory (LSTM) and convolutional neural network (CNN), are investigated, illustrating their suitability and efficiency for detecting and mitigating DDoS attacks. The paper focuses on TCP, UDP, and ICMP flood attacks that target the controller. The performance of the models was evaluated based on accuracy, recall, and true negative rate, and compared against classical machine learning models. We further provide details on the time taken to detect and mitigate the attack. Our results show that RNN LSTM is a viable deep learning algorithm for detecting and mitigating DDoS attacks on the SDN controller. Our proposed model produced an accuracy of 89.63%, outperforming linear models such as SVM (86.85%) and Naive Bayes (82.61%). Although KNN outperformed our proposed model (achieving an accuracy of 99.4%), our proposed model provides a good trade-off between precision and recall, which makes it suitable for DDoS classification. In addition, we found that the split ratio of the training and testing datasets can change the measured performance of a deep learning algorithm: the model achieved its best performance with a 70/30 split, compared with 80/20 and 60/40 split ratios.
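The three evaluation metrics the abstract names (accuracy, recall, and true negative rate) all derive from the confusion-matrix counts, as this small helper shows; the formulas are standard, and the example counts are made up, not the paper's results.

```python
def classification_metrics(tp, tn, fp, fn):
    """Compute accuracy, recall (true positive rate), and true negative
    rate from confusion-matrix counts."""
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total
    recall = tp / (tp + fn)        # fraction of attacks actually caught
    tnr = tn / (tn + fp)           # fraction of benign traffic passed through
    return accuracy, recall, tnr
```

For DDoS defense, recall and TNR pull in opposite directions: an over-aggressive detector catches every flood but drops legitimate traffic, which is why the abstract emphasizes the precision/recall trade-off rather than raw accuracy.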

    On Blockchain and IoT Integration Platforms: Current Implementation Challenges and Future Perspectives

    Digitization and automation have engulfed every scope and sphere of life, with the Internet of Things (IoT) as the main enabler of the revolution. Challenges still exist in IoT that need to be addressed, such as the limited address space for the increasing number of devices when using IPv4 and IPv6, as well as key security issues such as vulnerable access control mechanisms. Blockchain is a distributed ledger technology with immense benefits such as enhanced security and traceability; thus, it can serve as a good foundation for applications based on transactions and interactions. IoT implementations and applications are by definition distributed, which means blockchain can help solve most of the security vulnerabilities and traceability concerns of IoT by serving as a ledger that keeps track of how devices interact, which state they are in, and how they transact with other IoT devices. IoT applications have mainly been implemented with technologies such as cloud and fog computing and AI to help address some of these key challenges. This paper clearly outlines the key implementation challenges and technical choices involved in making a successful blockchain IoT (BIoT) project. The security and privacy aspects of BIoT applications are also analyzed, and several relevant solutions to improve the scalability and throughput of such applications are proposed. The paper also reviews integration schemes and monitoring frameworks for BIoT applications, and proposes a hybrid blockchain IoT integration architecture that makes use of containerization.

    A Survey on Network Optimization Techniques for Blockchain Systems

    The growth of the Internet of Things (IoT) calls for secure solutions for industrial applications, and the security of IoT can potentially be improved by blockchain. However, blockchain technology suffers from scalability issues that hinder its integration with IoT. Solutions to blockchain's scalability issues, such as minimizing the computational complexity of consensus algorithms or reducing blockchain storage requirements, have received attention. However, to realize the full potential of blockchain in IoT, the inefficiencies of its inter-peer communication must also be addressed. For example, blockchain uses a flooding technique to share blocks, resulting in duplicates and inefficient bandwidth usage. Moreover, blockchain peers use a random neighbor selection (RNS) technique to decide which other peers to exchange blockchain data with. As a result, the peer-to-peer (P2P) topology formation limits the effective achievable throughput. This paper provides a survey of the state-of-the-art network structures and communication mechanisms used in blockchain and establishes the need for network-based optimization. Additionally, it discusses the blockchain architecture and its layers, categorizes existing literature into those layers, and surveys state-of-the-art optimization frameworks, analyzing their effectiveness and ability to scale. Finally, the paper presents recommendations for future work.
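The flooding inefficiency described above can be made concrete with a small simulation: every peer forwards a new block to all of its neighbors, so the message count exceeds the number of peers that actually needed the block. This is a generic gossip sketch for illustration, not a model of any specific blockchain client.

```python
import random

def select_neighbors(peers, k, seed=None):
    """Random neighbor selection (RNS): pick k peers uniformly at random,
    ignoring topology and bandwidth - the inefficiency the survey highlights."""
    rng = random.Random(seed)
    return rng.sample(peers, min(k, len(peers)))

def flood(adjacency, origin):
    """Naive flooding: each peer that learns of a block forwards it to all
    neighbors. Returns (messages_sent, peers_reached); messages in excess of
    peers_reached - 1 are redundant duplicates."""
    received, messages, frontier = {origin}, 0, [origin]
    while frontier:
        nxt = []
        for peer in frontier:
            for nb in adjacency[peer]:
                messages += 1
                if nb not in received:
                    received.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return messages, len(received)
```

On a fully connected triangle, flooding sends 6 messages to reach 3 peers, where a spanning tree would need only 2; the gap widens with denser topologies, which is the bandwidth waste the survey targets.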